Remove test checks for Spark versions before 3.2.0 #11316
base: branch-25.02
Conversation
Signed-off-by: Sameer Raheja <[email protected]>
…into branch-24.10-remove_3_1_test_checks
```diff
@@ -118,7 +105,7 @@ def test_cast_string_date_invalid_ansi(invalid):
-# test try_cast in Spark versions >= 320 and < 340
-@pytest.mark.skipif(is_before_spark_320() or is_spark_340_or_later() or is_databricks113_or_later(), reason="try_cast only in Spark 3.2+")
+@pytest.mark.skipif(is_before_spark_340() or is_databricks113_or_later(), reason="try_cast only in Spark 3.2+")
```
Isn't this supposed to be is_spark_340_or_later instead of is_before_spark_340?
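A minimal sketch of why the boundary direction matters here. The helpers below are hypothetical stand-ins; the real spark-rapids predicates read the version from the active Spark session:

```python
# Hypothetical stand-ins for the spark-rapids version-gate helpers; the real
# ones inspect the live Spark session rather than a module constant.
SPARK_VERSION = "3.3.0"  # assumed runtime version for this sketch

def parse_version(v):
    """Turn '3.4.0' into a comparable tuple (3, 4, 0)."""
    return tuple(int(p) for p in v.split("."))

def is_before_spark_340():
    return parse_version(SPARK_VERSION) < (3, 4, 0)

def is_spark_340_or_later():
    return parse_version(SPARK_VERSION) >= (3, 4, 0)

# The two predicates are exact complements, so picking the wrong one in a
# skipif flips which side of the 3.4.0 boundary runs the test: skipping on
# is_spark_340_or_later() keeps the test on 3.2.x/3.3.x (the try_cast
# window), while skipping on is_before_spark_340() would keep it only
# on 3.4.0 and later.
```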
2024 copyrights
```diff
@@ -224,8 +220,7 @@ def test_dpp_reuse_broadcast_exchange_cpu_scan(spark_tmp_table_factory):
 @pytest.mark.parametrize('s_index', list(range(len(_statements))), ids=idfn)
 @pytest.mark.parametrize('aqe_enabled', [
     'false',
-    pytest.param('true', marks=pytest.mark.skipif(is_before_spark_320() and not is_databricks_runtime(),
-                 reason='Only in Spark 3.2.0+ AQE and DPP can be both enabled'))
+    pytest.param('true', marks=pytest.mark.skipif(not is_databricks_runtime(), reason='AQE+DPP not supported on Databricks'))
```
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Previous condition was AND, not OR, and would only trigger on Spark 3.1.x.
```suggestion
    pytest.param('true', marks=pytest.mark.skipif(not is_databricks_runtime(),
                 reason='AQE+DPP not supported on Databricks'))
```
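The AND-vs-OR point above can be sketched with a small truth table. The functions are illustrative stand-ins for the skipif conditions, not the real plugin helpers:

```python
def old_condition(is_before_320, is_databricks):
    # original marker: skip only on pre-3.2 Spark that is NOT Databricks
    return is_before_320 and not is_databricks

def new_condition(is_before_320, is_databricks):
    # rewritten marker: skip on every non-Databricks runtime
    return not is_databricks

# With pre-3.2 support removed, is_before_320 is always False, so the old
# condition never skips; the rewrite would instead skip all non-Databricks
# runs -- a behavior change, which is what the review comment flags.
for before_320 in (True, False):
    for databricks in (True, False):
        print(f"before_320={before_320} databricks={databricks} "
              f"old={old_condition(before_320, databricks)} "
              f"new={new_condition(before_320, databricks)}")
```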
```diff
-    # special constant values
-    pytest.param("\"" + optional_whitespace_regex + "(now|today|tomorrow|epoch)" + optional_whitespace_regex + "\"", marks=pytest.mark.xfail(condition=is_before_spark_320(), reason="https://github.com/NVIDIA/spark-rapids/issues/9724")),
```
This parameter should not have been deleted, just the xfail marker removed, e.g.:
```python
# special constant values
"\"" + optional_whitespace_regex + "(now|today|tomorrow|epoch)" + optional_whitespace_regex + "\"",
```
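The suggestion amounts to keeping the parametrize entry and dropping only its mark. A hedged before/after sketch (the regex value and xfail reason below are placeholders, not the real ones):

```python
import pytest

optional_whitespace_regex = r"[ \t]*"  # assumed placeholder for the real regex

# Before: the value carried an xfail mark gated on a pre-3.2 condition.
marked = pytest.param(
    "\"" + optional_whitespace_regex + "(now|today|tomorrow|epoch)"
    + optional_whitespace_regex + "\"",
    marks=pytest.mark.xfail(condition=False, reason="hypothetical issue link"),
)

# After: the gating condition can no longer be true, so the entry collapses
# to a plain value in the parametrize list -- the parameter itself survives.
unmarked = ("\"" + optional_whitespace_regex + "(now|today|tomorrow|epoch)"
            + optional_whitespace_regex + "\"")
```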
```diff
@@ -192,8 +192,7 @@ def query_map_scalar(spark):
 @allow_non_gpu('WindowLocalExec')
-@datagen_overrides(seed=0, condition=is_before_spark_314()
-    or (not is_before_spark_320() and is_before_spark_323())
+@datagen_overrides(seed=0, condition=is_before_spark_322()
```
```suggestion
@datagen_overrides(seed=0, condition=is_before_spark_323()
```
2024 copyrights
2024 copyrights
is_neg_dec_scale_bug_version should also be updated, as it's using spark_version() comparisons directly against 3.1.1 and 3.1.3.
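A hedged sketch of why a helper built on raw spark_version() string comparisons needs attention when old versions are dropped. The function body below is illustrative, not the actual plugin code:

```python
def is_neg_dec_scale_bug_version(spark_version):
    # Illustrative version of the check named in the comment: a string-range
    # test pinned to the 3.1.x line, which becomes dead code once 3.1.x
    # support is removed.
    return "3.1.1" <= spark_version < "3.1.3"

def version_tuple(v):
    """Tuple comparison avoids lexicographic surprises like '3.10' < '3.2'."""
    return tuple(int(p) for p in v.split("."))

# String comparison happens to work while every component stays single-digit,
# but it misorders versions like 3.10.0:
assert "3.10.0" < "3.2.0"                                # wrong as versions
assert version_tuple("3.10.0") > version_tuple("3.2.0")  # correct ordering
```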
Spark versions before 3.2.0 are no longer supported. This PR removes the following test conditions:
Fixes #11160